Improving implementation of linear discriminant analysis for the high dimension/small sample size problem
Authors
Abstract
Classification based on Fisher’s Linear Discriminant Analysis (FLDA) is challenging when the number of variables greatly exceeds the number of given samples: the original FLDA needs to be carefully modified, and with high dimensionality implementation issues such as the reduction of storage costs are of crucial importance. In this paper we first review popular methods for the high dimension/small sample size problem and choose the one closest, in some sense, to the classical regular approach. We then improve its implementation with regard to computational costs, storage costs and numerical stability. This is achieved by combining a variety of known and new implementation strategies. Experiments demonstrate the superiority of the resulting algorithm, with respect to both overall costs and classification rates, compared with other methods.
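As a rough illustration of what such a modification involves, the sketch below shows one common high dimension/small sample size strategy: restrict FLDA to the low-dimensional subspace spanned by the centered training samples, so that no p × p scatter matrix is ever formed or stored. This is a minimal sketch under assumed choices (plain NumPy, a small ridge on the within-class scatter), not the specific algorithm developed in the paper.

```python
# Minimal sketch (not the paper's algorithm): FLDA for p >> n via a thin SVD.
# The data are mapped to the (at most n-1)-dimensional subspace spanned by the
# centered training samples; classical FLDA is then run in that subspace.
import numpy as np

def hdlss_flda_fit(X, y, ridge=1e-6, tol=1e-10):
    """X: (n, p) training data with p possibly >> n; y: (n,) integer class labels."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    Xc = X - mean

    # Thin SVD of the centered data: Xc = U diag(S) Vt, with rank r <= n - 1.
    _, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    r = int(np.sum(S > tol * S[0]))
    V = Vt[:r].T                       # (p, r) orthonormal basis of the data subspace
    Z = Xc @ V                         # (n, r) coordinates in that subspace

    # Within- and between-class scatter in the reduced space
    # (Z is globally centered, so its overall mean is zero).
    Sw = np.zeros((r, r))
    Sb = np.zeros((r, r))
    for c in classes:
        Zc = Z[y == c]
        mc = Zc.mean(axis=0)
        Sw += (Zc - mc).T @ (Zc - mc)
        Sb += len(Zc) * np.outer(mc, mc)

    # A small ridge keeps Sw positive definite; how best to regularize is exactly
    # the kind of implementation question the HDLSS literature is concerned with.
    Sw += ridge * (np.trace(Sw) / r) * np.eye(r)

    # Solve the generalized eigenproblem Sb w = lambda Sw w via Cholesky whitening.
    L = np.linalg.cholesky(Sw)
    M = np.linalg.solve(L, np.linalg.solve(L, Sb).T)   # = L^{-1} Sb L^{-T}, symmetric
    _, E = np.linalg.eigh(M)
    k = min(len(classes) - 1, r)
    E = E[:, ::-1][:, :k]                              # top-k eigenvectors
    W = V @ np.linalg.solve(L.T, E)                    # (p, k) discriminant directions

    centroids = np.array([(X[y == c] - mean).mean(axis=0) @ W for c in classes])
    return mean, W, centroids, classes

def hdlss_flda_predict(model, Xtest):
    """Nearest-centroid classification in the discriminant space."""
    mean, W, centroids, classes = model
    T = (Xtest - mean) @ W
    d = ((T[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[np.argmin(d, axis=1)]
```

With, say, X of shape (100, 20000), the largest objects handled in this sketch are X itself and the (20000 × r) basis V; the dominant cost is the thin SVD, and no 20000 × 20000 scatter matrix is ever built.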
Similar resources
A Multi Linear Discriminant Analysis Method Using a Subtraction Criteria
Linear dimension reduction has been used in different applications such as image processing and pattern recognition. All of these methods fold the original data into vectors and project them onto a small number of dimensions. But in some applications we may face data that are not vectors, such as image data. Folding multidimensional data into vectors causes the curse of dimensionality and mixes the differe...
A Novel Nonparametric Linear Discriminant Analysis for High-Dimensional Data Classification
Linear discriminant analysis (LDA) has played an important role in dimension reduction in the pattern recognition field. Basically, LDA has three deficiencies in dealing with classification problems. First, LDA is well-suited only for normally distributed data. Second, the number of features that can be extracted is limited by the rank of the between-class scatter matrix. Third, the singularity problem ari...
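The second and third deficiencies are easy to see numerically. The following small example (an assumed illustration with synthetic data, not taken from the cited paper) shows that with C classes the between-class scatter has rank at most C − 1, and that when p > n the within-class scatter is singular, so its inverse cannot be formed without modification.

```python
# Assumed illustration of two classical LDA limitations in the p > n setting.
import numpy as np

rng = np.random.default_rng(0)
n, p, C = 30, 100, 3                      # small sample size n, high dimension p
X = rng.standard_normal((n, p))
y = np.repeat(np.arange(C), n // C)       # 10 samples per class

mean = X.mean(axis=0)
Sw = np.zeros((p, p))
Sb = np.zeros((p, p))
for c in range(C):
    Xc = X[y == c]
    mc = Xc.mean(axis=0)
    Sw += (Xc - mc).T @ (Xc - mc)
    Sb += len(Xc) * np.outer(mc - mean, mc - mean)

print(np.linalg.matrix_rank(Sb))          # 2  (= C - 1: at most C - 1 features)
print(np.linalg.matrix_rank(Sw))          # 27 (= n - C < p, hence Sw is singular)
```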
Feature reduction of hyperspectral images: Discriminant analysis and the first principal component
When the number of training samples is limited, feature reduction plays an important role in classification of hyperspectral images. In this paper, we propose a supervised feature extraction method based on discriminant analysis (DA) which uses the first principal component (PC1) to weight the scatter matrices. The proposed method, called DA-PC1, copes with the small sample size problem and has...
Asymptotic Theory for Discriminant Analysis in High Dimension Low Sample Size
This paper is based on the author’s thesis, “Pattern recognition based on naive canonical correlations in high dimension low sample size”, and is concerned with discriminant analysis for multi-class problems in a High Dimension Low Sample Size (HDLSS) context. The proposed discrimination method is based on canonical correlations between the predictors and the response vector of class labels. ...
Design and Implementation of Robust 2D Face Recognition System for Illumination Variations
Illumination variation is a challenging problem in the face recognition research area. The same person can appear greatly different under varying lighting conditions. This paper presents a face recognition system that is invariant to illumination variations. A face recognition system that uses Linear Discriminant Analysis (LDA) as the feature extractor faces the Small Sample Size (SSS) problem. It consists of implemen...
Journal: Computational Statistics & Data Analysis
Volume 52, Issue –
Pages –
Publication date: 2007